
    Classification efficiencies for robust linear discriminant analysis.

    Linear discriminant analysis is typically carried out using Fisher's method. This method relies on the sample averages and covariance matrices computed from the different groups constituting the training sample. Since sample averages and covariance matrices are not robust, it has been proposed to use robust estimators of location and covariance instead, yielding a robust version of Fisher's method. In this paper, relative classification efficiencies of the robust procedures with respect to the classical method are computed. Second-order influence functions prove useful for computing these classification efficiencies. It turns out that, when an appropriate robust estimator is used, the loss in classification efficiency at the normal model remains limited. These findings are confirmed by finite-sample simulations.
    Keywords: Classification efficiency; Discriminant analysis; Error rate; Fisher rule; Influence function; Robustness
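    The plug-in idea can be sketched in a few lines: Fisher's direction w = S⁻¹(m₁ − m₀), computed with interchangeable location and scatter estimators. The "robust" pair below (coordinatewise median plus a distance-trimmed covariance) is a deliberately crude stand-in for the affine-equivariant robust estimators the paper considers; it only illustrates the mechanism.

```python
import numpy as np

def fisher_direction(x0, x1, loc, scat):
    """Fisher direction w = S^{-1}(m1 - m0), with pluggable
    location (loc) and scatter (scat) estimators."""
    m0, m1 = loc(x0), loc(x1)
    s = 0.5 * (scat(x0) + scat(x1))                # pooled scatter
    return np.linalg.solve(s, m1 - m0)

def trimmed_cov(x, keep=0.75):
    """Crude robust scatter: covariance of the fraction `keep` of
    points closest to the coordinatewise median (illustration only)."""
    d = np.linalg.norm(x - np.median(x, axis=0), axis=1)
    core = x[np.argsort(d)[: int(keep * len(x))]]
    return np.cov(core, rowvar=False)

rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, size=(200, 2))           # group 0 around (0, 0)
x1 = rng.normal(2.0, 1.0, size=(200, 2))           # group 1 around (2, 2)
x0_bad = np.vstack([x0, np.full((10, 2), 50.0)])   # contaminate group 0

mean_cov = (lambda x: x.mean(axis=0), lambda x: np.cov(x, rowvar=False))
w_clean = fisher_direction(x0, x1, *mean_cov)
w_classic = fisher_direction(x0_bad, x1, *mean_cov)
w_robust = fisher_direction(x0_bad, x1,
                            lambda x: np.median(x, axis=0), trimmed_cov)

cosine = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
```

    With ten far-out points in one group, the classical direction collapses while the robust plug-in stays close to the direction fitted on clean data.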

    Robust linear discriminant analysis for multiple groups: influence and classification efficiencies.

    Linear discriminant analysis for multiple groups is typically carried out using Fisher's method. This method relies on the sample averages and covariance matrices computed from the different groups constituting the training sample. Since sample averages and covariance matrices are not robust, it is proposed to use robust estimators of location and covariance instead, yielding a robust version of Fisher's method. In this paper, expressions are derived for the influence that an observation in the training set has on the error rate of the Fisher method for multiple linear discriminant analysis. These influence functions on the error rate turn out to be unbounded for the classical rule, but bounded when a robust approach is used. Using these influence functions, we compute relative classification efficiencies of the robust procedures with respect to the classical method. It is shown that, by using an appropriate robust estimator, the loss in classification efficiency at the normal model remains limited. These findings are confirmed by finite-sample simulations.
    Keywords: Classification; Covariance; Discriminant analysis; Efficiency; Error rate; Estimator; Fisher rule; Functions; Influence function; Model; Multiple groups; Research; Robustness; Simulation; Training
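    The bounded-versus-unbounded contrast can be checked empirically with a two-group sensitivity-curve simulation: push a small cluster of contaminating points further out and track the test error of the fitted Fisher rule. This is a numeric sketch, not the paper's analytic influence functions, and the "robust" variant (median plus distance-trimmed covariance) is again a crude stand-in for the estimators the paper studies.

```python
import numpy as np

def fisher_rule(x0, x1, loc, scat):
    """Fisher rule: assign to group 1 iff w.x > w.midpoint."""
    m0, m1 = loc(x0), loc(x1)
    s = 0.5 * (scat(x0) + scat(x1))
    w = np.linalg.solve(s, m1 - m0)
    c = w @ (m0 + m1) / 2
    return lambda x: (x @ w > c).astype(int)

def trimmed_cov(x, keep=0.75):
    d = np.linalg.norm(x - np.median(x, axis=0), axis=1)
    return np.cov(x[np.argsort(d)[: int(keep * len(x))]], rowvar=False)

rng = np.random.default_rng(1)
x0 = rng.normal(0.0, 1.0, size=(200, 2))
x1 = rng.normal(2.0, 1.0, size=(200, 2))
t0 = rng.normal(0.0, 1.0, size=(2000, 2))       # clean test sets
t1 = rng.normal(2.0, 1.0, size=(2000, 2))

def error(rule):
    return 0.5 * (rule(t0).mean() + (1 - rule(t1)).mean())

err_classic, err_robust = [], []
for t in [2.0, 10.0, 50.0]:                     # outlier distance grows
    bad = np.vstack([x0, np.full((10, 2), t)])  # 10 outliers at (t, t)
    err_classic.append(error(fisher_rule(bad, x1,
        lambda x: x.mean(axis=0), lambda x: np.cov(x, rowvar=False))))
    err_robust.append(error(fisher_rule(bad, x1,
        lambda x: np.median(x, axis=0), trimmed_cov)))
```

    The classical error rate deteriorates without bound as the contamination moves out; the robust rule's error stays near the clean-data level, mirroring the bounded influence functions.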

    Algorithms for Projection-Pursuit robust principal component analysis.

    The results of a standard principal component analysis (PCA) can be affected by the presence of outliers. Hence robust alternatives to PCA are needed. One of the most appealing robust methods for principal component analysis uses the Projection-Pursuit principle. Here, one projects the data on a lower-dimensional space such that a robust measure of variance of the projected data is maximized. The Projection-Pursuit-based method for principal component analysis has recently been introduced in the field of chemometrics, where the number of variables is typically large. In this paper, it is shown that the currently available algorithm for robust Projection-Pursuit PCA performs poorly in the presence of many variables. A new algorithm is proposed that is more suitable for the analysis of chemical data. Its performance is studied by means of simulation experiments and illustrated on some real data sets.
    Keywords: Multivariate statistics; Optimization; Numerical precision; Outliers; Robustness; Scale estimators; Estimators; Regression
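    The projection-pursuit idea is easy to prototype: project the data onto candidate directions and keep the direction with the largest robust spread. Using the centered observations themselves as the candidate grid is the Croux and Ruiz-Gazen scheme that such algorithms build on; the sketch below computes only the first component and uses the MAD as the robust scale, both choices made for brevity.

```python
import numpy as np

def mad(z):
    """Median absolute deviation, scaled for consistency at the normal."""
    return 1.4826 * np.median(np.abs(z - np.median(z)))

def pp_first_pc(X):
    """First projection-pursuit robust PC: scan candidate directions
    (here: the centered observations themselves) and keep the one
    maximizing a robust scale of the projected data."""
    Xc = X - np.median(X, axis=0)
    D = Xc / np.linalg.norm(Xc, axis=1, keepdims=True)
    scores = np.array([mad(Xc @ d) for d in D])
    return D[np.argmax(scores)]

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 5)) * np.array([5.0, 1, 1, 1, 1])  # true PC1 = e1
X[:15] = rng.normal(size=(15, 5))
X[:15, 1] += 30.0                                # 5% outliers along e2

v_rob = pp_first_pc(X)
v_cls = np.linalg.svd(X - X.mean(axis=0),
                      full_matrices=False)[2][0]  # classical PC1
```

    Five percent contamination is enough to pull the classical first component onto the outlier axis, while the MAD-based projection-pursuit direction stays on the true high-variance axis.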

    A Comparison of Algorithms for the Multivariate L1-Median

    The L1-median is a robust estimator of multivariate location with good statistical properties. Several algorithms for computing the L1-median are available: problem-specific algorithms can be used, but also general optimization routines. The aim is to compare different algorithms with respect to their precision and runtime. This is possible because all considered algorithms have been implemented in a standardized manner in the open-source environment R. In most situations, the algorithm based on the optimization routine NLM (non-linear minimization) clearly outperforms the other approaches. Its low computation time makes applications to large and high-dimensional data feasible.
    Keywords: Algorithm; Multivariate median; Optimization; Robustness
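    As a concrete baseline, the classic Weiszfeld iteration, a standard problem-specific algorithm for this estimator, computes the L1-median as the fixed point of a distance-weighted mean. A minimal sketch (the paper's comparison is among R implementations; this Python version only shows what is being computed):

```python
import numpy as np

def l1_median(X, tol=1e-8, max_iter=500):
    """Weiszfeld's iteration for the spatial (L1-) median:
    the point m minimizing sum_i ||x_i - m||."""
    m = np.median(X, axis=0)                    # robust starting value
    for _ in range(max_iter):
        d = np.linalg.norm(X - m, axis=1)
        d = np.where(d < 1e-12, 1e-12, d)       # guard: m hits a data point
        w = 1.0 / d
        m_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(m_new - m) < tol:
            return m_new
        m = m_new
    return m

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
X[:50] += 20.0                                  # 10% of points shifted far out
m = l1_median(X)
```

    With 10% contamination the coordinatewise mean is dragged far from the origin, while the L1-median barely moves.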

    Partial robust M-regression.

    Partial Least Squares (PLS) is a standard statistical method in chemometrics. It can be considered as an incomplete, or 'partial', version of the Least Squares estimator of regression, applicable when high or perfect multicollinearity is present in the predictor variables. The Least Squares estimator is well known to be an optimal estimator for regression, but only when the error terms are normally distributed. In the absence of normality, and in particular when outliers are in the data set, other more robust regression estimators have better properties. In this paper a 'partial' version of M-regression estimators is defined. If an appropriate weighting scheme is chosen, partial M-estimators become entirely robust to any type of outlying points, and are called Partial Robust M-estimators. It is shown that partial robust M-regression outperforms existing methods for robust PLS regression in terms of statistical precision and computational speed, while keeping good robustness properties. The method is applied to a data set consisting of EPXMA spectra of archaeological glass vessels. This data set contains several outliers, and the advantages of partial robust M-regression are illustrated: it yields much smaller prediction errors for noisy calibration samples than PLS. On the other hand, if the data follow a normal model perfectly well, the loss in efficiency is very small.
    Keywords: Advantages; Applications; Calibration; Data; Distribution; Efficiency; Estimator; Least-squares; M-estimators; Methods; Model; Optimal; Ordinary least squares; Outliers; Partial least squares; Precision; Prediction; Projection-pursuit; Regression; Robust regression; Robustness; Simulation; Spectrometric quantitation; Squares; Studies; Variables; Yield
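    The core ingredient, M-type reweighting of standardized residuals, can be shown on an ordinary regression. The full method additionally downweights leverage points and embeds the weights in a PLS fit, which this sketch omits; the Huber weight function and the MAD residual scale below are illustrative choices, not necessarily the paper's.

```python
import numpy as np

def huber_w(r, c=1.345):
    """Huber weights on standardized residuals: 1 inside [-c, c],
    decaying as c/|r| outside."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / a)

def m_regression(X, y, n_iter=30):
    """IRLS M-estimator of a linear model (intercept included)."""
    A = np.column_stack([np.ones(len(X)), X])
    beta = np.linalg.lstsq(A, y, rcond=None)[0]        # LS start
    for _ in range(n_iter):
        r = y - A @ beta
        s = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale
        w = huber_w(r / s)
        beta = np.linalg.lstsq(A * w[:, None] ** 0.5,
                               y * w ** 0.5, rcond=None)[0]
    return beta

rng = np.random.default_rng(4)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + 0.2 * rng.normal(size=200)  # true model: 1 + 2x
y[:20] += 15.0                                  # 10% vertical outliers
b_ols = np.linalg.lstsq(np.column_stack([np.ones(200), x]), y,
                        rcond=None)[0]
b_m = m_regression(x.reshape(-1, 1), y)
```

    The vertical outliers pull the least-squares intercept up by roughly the contamination fraction times the outlier size, while the reweighted fit stays near the true coefficients.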

    Robust continuum regression.

    Several applications of continuum regression (CR) to non-contaminated data have shown that a significant improvement in predictive power can be obtained compared to the three standard techniques which it encompasses (ordinary least squares (OLS), principal component regression (PCR) and partial least squares (PLS)). For contaminated data, continuum regression may yield aberrant estimates due to its non-robustness with respect to outliers. Also for data originating from a distribution which significantly differs from the normal distribution, continuum regression may yield very inefficient estimates. In the current paper, robust continuum regression (RCR) is proposed. To construct the estimator, an algorithm based on projection pursuit (PP) is proposed. The robustness and good efficiency properties of RCR are shown by means of a simulation study. An application to an X-ray fluorescence analysis of hydrometallurgical samples illustrates the method's applicability in practice.
    Keywords: Regression; Applications; Data; Ordinary least squares; Least-squares; Squares; Partial least squares; Yield; Outliers; Distribution; Estimator; Projection-pursuit; Robustness; Efficiency; Simulation; Studies
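    The criterion CR maximizes over unit directions w can be written down directly (Stone and Brooks): T_delta(w) = cov(Xw, y)^2 * var(Xw)^(delta/(1-delta) - 1), which reduces to OLS, PLS and PCR as delta tends to 0, equals 1/2, and tends to 1. A robust PP version would plug robust scale and covariance measures into the same criterion; the sketch below evaluates only the classical criterion on a grid of 2-D directions and checks the PLS case.

```python
import numpy as np

def cr_criterion(X, y, w, delta):
    """Stone & Brooks continuum-regression criterion
    T(w) = cov(Xw, y)^2 * var(Xw)^(delta/(1-delta) - 1)."""
    t = X @ w
    gamma = delta / (1.0 - delta)
    return np.cov(t, y)[0, 1] ** 2 * np.var(t, ddof=1) ** (gamma - 1.0)

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.3], [0.3, 1.0]])
y = X @ np.array([1.0, -0.5]) + 0.1 * rng.normal(size=200)
Xc, yc = X - X.mean(axis=0), y - y.mean()

# grid of unit directions over the half-circle (sign is irrelevant)
angles = np.linspace(0, np.pi, 2000, endpoint=False)
W = np.column_stack([np.cos(angles), np.sin(angles)])
best = W[np.argmax([cr_criterion(Xc, yc, w, 0.5) for w in W])]

w_pls = Xc.T @ yc            # delta = 1/2: criterion is cov^2, maximized
w_pls /= np.linalg.norm(w_pls)   # by the first PLS direction X'y
```

    At delta = 1/2 the variance exponent vanishes, so the grid maximizer must agree with the analytic PLS direction up to grid resolution; swapping in a robust covariance and scale at this point is exactly the PP construction of RCR.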

    Fitting multiplicative models by robust alternating regressions.

    In this paper a robust approach for fitting multiplicative models is presented. Focus is on the factor analysis model, where we estimate factor loadings and scores by a robust alternating regression algorithm. The approach is highly robust, and also works well when there are more variables than observations. The technique yields a robust biplot, depicting the interaction structure between individuals and variables. This biplot is not predetermined by outliers, which can be retrieved from the residual plot. Also provided is an accompanying robust R²-plot to determine the appropriate number of factors. The approach is illustrated by real and artificial examples and compared with factor analysis based on robust covariance matrix estimators. The same estimation technique can fit models with both additive and multiplicative effects (FANOVA models) to two-way tables, thereby extending the median polish technique.
    Keywords: Alternating regression; Approximation; Biplot; Covariance; Dispersion matrices; Effects; Estimator; Exploratory data analysis; Factor analysis; Factors; FANOVA; Least-squares; Matrix; Median polish; Model; Models; Outliers; Principal components; Robustness; Structure; Two-way table; Variables; Yield
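    The alternating scheme can be sketched for the rank-one multiplicative model X ≈ a b': fix the loadings, robustly regress each row on them to update the scores, then swap roles. The Huber-IRLS inner regressions, the single factor, and the starting values below are crude simplifications chosen for brevity; the paper's algorithm differs in its choice of robust regressions and handles several factors.

```python
import numpy as np

def huber_w(r, c=1.345):
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / a)

def robust_slopes(Y, t, n_iter=10):
    """Row-wise robust slopes (no intercept) of Y (n x m) on t (m,),
    fit by iteratively reweighted least squares with Huber weights."""
    s = Y @ t / (t @ t)                          # least-squares start
    for _ in range(n_iter):
        R = Y - np.outer(s, t)
        scale = 1.4826 * np.median(np.abs(R)) + 1e-12
        W = huber_w(R / scale)
        s = (W * Y) @ t / (W @ (t * t))
    return s

def rank1_robust(X, n_alt=25):
    """Alternating robust regressions for X ~ a b' (rank-one sketch)."""
    b = np.median(X, axis=0)
    b /= np.linalg.norm(b)
    for _ in range(n_alt):
        a = robust_slopes(X, b)                  # scores given loadings
        b = robust_slopes(X.T, a)                # loadings given scores
        b /= np.linalg.norm(b)
    return robust_slopes(X, b), b

rng = np.random.default_rng(6)
a_true = rng.normal(size=60) + 2.0
b_true = np.ones(8) / np.sqrt(8.0)
X = np.outer(a_true, b_true) + 0.05 * rng.normal(size=(60, 8))
X[::7, 2] += 25.0                                # scattered cell outliers

a_hat, b_hat = rank1_robust(X)
b_svd = np.linalg.svd(X, full_matrices=False)[2][0]   # LS rank-1 loadings
```

    A handful of contaminated cells is enough to drag the least-squares (SVD) loadings toward the affected column, while the alternating robust fit recovers the true loading vector; the cell outliers reappear as large entries of the residual matrix, which is what makes the robust biplot interpretable.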